Neighborhoods as Nuisance Parameters? Robustness vs. Semiparametrics
Authors
Abstract
Deviations from the center within a robust neighborhood may naturally be considered an infinite dimensional nuisance parameter. Thus, the semiparametric method may be tried, which is to compute the score function for the main parameter minus its orthogonal projection on the closed linear tangent space for the nuisance parameter, and then rescale for Fisher consistency. We derive such a semiparametric influence curve by nonlinear projection on the tangent balls arising in robust statistics. This semiparametric influence curve is then compared with the optimally robust influence curve that minimizes the maximum weighted mean square error of the corresponding asymptotically linear estimators over infinitesimal neighborhoods. While there is coincidence for Hellinger balls, at least clipping is achieved for total variation and contamination neighborhoods; the semiparametric method, however, in general falls short of solving the minimax MSE estimation problem for the gross error models. The semiparametric approach is carried further to testing contaminated hypotheses. In the one-sided case, for testing hypotheses defined by any two closed convex sets of tangents, a saddle point is furnished by projection on the set of differences of these sets. For total variation and contamination neighborhoods, we thus recover the robust asymptotic tests based on least favorable pairs.
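The construction described in the abstract (project the score, then rescale for Fisher consistency) can be sketched in standard semiparametric notation; the symbols below (Λ_θ for the score of the main parameter, T for the nuisance tangent space, Π for L2-orthogonal projection) are common conventions assumed here, not notation taken from the paper itself:

```latex
% Efficient score: the score for the main parameter minus its
% orthogonal projection onto the closed linear nuisance tangent space T
\tilde{\Lambda}_\theta
  = \Lambda_\theta - \Pi\!\left(\Lambda_\theta \,\middle|\, \overline{\operatorname{lin}}\, T\right)

% Rescaling by the efficient Fisher information gives an influence
% curve that is Fisher consistent:
\psi_\theta = \tilde{I}_\theta^{-1}\,\tilde{\Lambda}_\theta ,
\qquad
\tilde{I}_\theta = \operatorname{E}_\theta\!\bigl[\tilde{\Lambda}_\theta \tilde{\Lambda}_\theta^{\top}\bigr]
```

The paper's point of departure is that the robust tangent sets are balls rather than linear spaces, so the projection Π above becomes a nonlinear projection onto a convex set.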
Similar Papers
Neighborhoods as Nuisance Parameters
Deviations from the center within a robust neighborhood may naturally be considered an infinite dimensional nuisance parameter. Thus, in principle, the semiparametric method may be tried, which is to compute the score function for the main parameter minus its orthogonal projection on the closed linear tangent space for the nuisance parameter, and then rescale for Fisher consistency. We derive s...
Fawzi, Frossard: Measuring the Effect of Nuisance Variables
In real-world classification problems, nuisance variables can cause wild variability in the data. Nuisance variables correspond, for example, to geometric distortions of the image, occlusions, illumination changes, or any other deformations that do not alter the ground truth label of the image. It is therefore crucial that designed classifiers are robust to nuisance variables, especially when these are dep...
Orthogonal Machine Learning: Power and Limitations
Double machine learning provides √n-consistent estimates of parameters of interest even when high-dimensional or nonparametric nuisance parameters are estimated at an n^(-1/4) rate. The key is to employ Neyman-orthogonal moment equations which are first-order insensitive to perturbations in the nuisance parameters. We show that the n^(-1/4) requirement can be improved to n^(-1/(2k+2)) by employing a kth...
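The Neyman orthogonality referenced in this abstract is a first-order insensitivity condition on the moment function; a sketch in standard notation (W the data, θ the target parameter, η the nuisance with true value η₀; the notation is assumed, not drawn from the paper):

```latex
% Moment condition identifying the target parameter theta_0:
\operatorname{E}\bigl[m(W;\theta_0,\eta_0)\bigr] = 0

% Neyman orthogonality: the Gateaux derivative in any nuisance
% direction eta - eta_0 vanishes at the truth, so small estimation
% errors in eta have no first-order effect on the moment.
\left.\frac{\partial}{\partial r}\,
  \operatorname{E}\bigl[m\bigl(W;\theta_0,\eta_0 + r(\eta-\eta_0)\bigr)\bigr]
\right|_{r=0} = 0
```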
Improving the Efficiency of the Log-rank Test Using Auxiliary Covariates
Under the assumption of proportional hazards, the log-rank test is optimal for testing the null hypothesis H0 : β = 0, where β denotes the logarithm of the hazard ratio. However, if there are additional covariates that correlate with survival times, making use of their information will increase the efficiency of the log-rank test. We apply the theory of semiparametrics to characterize a clas...
Publication date: 2000